
    Economics and Engineering for Preserving Digital Content

    Progress towards practical long-term preservation seems to be stalled. Preservationists cannot afford specially developed technology, but must exploit what is created for the marketplace. Economic and technical facts suggest that most preservation work should be shifted from repository institutions to information producers and consumers. Prior publications describe solutions for all known conceptual challenges of preserving a single digital object, but do not deal with software development or scaling to large collections. Much of the document-handling software needed is already available; it has, however, not yet been selected, adapted, integrated, or deployed for digital preservation. The daily tools of both information producers and information consumers can be extended to embed preservation packaging without much burdening these users. We describe a practical strategy for detailed design and implementation. Document handling is intrinsically complicated because of human sensitivity to communication nuances. Our engineering section therefore starts by discussing how project managers can master the many pertinent details.
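
    The abstract does not name a packaging format; as a hedged sketch of what embedding preservation packaging in everyday tools might look like, the Python snippet below writes a document into a BagIt-style package with a SHA-256 manifest and a minimal provenance record. The directory layout, metadata fields, and file names are illustrative assumptions, not the authors' design.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def make_preservation_package(src: Path, dest: Path, creator: str) -> None:
    """Copy a document into a BagIt-style package: a data/ payload,
    a SHA-256 manifest, and a minimal provenance record."""
    payload = dest / "data"
    payload.mkdir(parents=True, exist_ok=True)
    content = src.read_bytes()
    (payload / src.name).write_bytes(content)

    # manifest-sha256.txt maps checksums to payload paths, as in BagIt
    digest = hashlib.sha256(content).hexdigest()
    (dest / "manifest-sha256.txt").write_text(f"{digest}  data/{src.name}\n")

    # Hypothetical minimal provenance record; a real package would carry
    # richer descriptive and technical metadata (e.g., PREMIS)
    info = {
        "creator": creator,
        "created": datetime.now(timezone.utc).isoformat(),
        "source_filename": src.name,
        "checksum_sha256": digest,
    }
    (dest / "bag-info.json").write_text(json.dumps(info, indent=2))

# Example invocation (assumes report.docx exists in the working directory)
make_preservation_package(Path("report.docx"), Path("report_package"),
                          "author@example.org")
```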

    Invest to Save: Report and Recommendations of the NSF-DELOS Working Group on Digital Archiving and Preservation

    Digital archiving and preservation are important areas for research and development, but there is no agreed-upon set of priorities or coherent plan for research in this area. Research projects in this area tend to be small and driven by particular institutional problems or concerns. As a consequence, proposed solutions from experimental projects and prototypes tend not to scale to millions of digital objects, nor do the results from disparate projects readily build on each other. It is also unclear whether it is worthwhile to seek general solutions or whether different strategies are needed for different types of digital objects and collections. The lack of coordination in both research and development means that there are some areas where researchers are reinventing the wheel while other areas are neglected. Digital archiving and preservation will therefore benefit from an exercise in analysis, priority setting, and planning for future research. The Working Group aims to survey current research activities, identify gaps, and develop a white paper proposing future research directions in the area of digital preservation. Potential areas for research include repository architectures and interoperability among digital archives; automated tools for capture, ingest, and normalization of digital objects; and harmonization of preservation formats and metadata. There are also opportunities for development of commercial products in the areas of mass storage systems, repositories and repository management systems, and data management software and tools.

    Reducing Zero-point Systematics in Dark Energy Supernova Experiments

    We study the effect of filter zero-point uncertainties on future supernova dark energy missions. Fitting for calibration parameters using simultaneous analysis of all Type Ia supernova standard candles achieves a significant improvement over more traditional fit methods. This conclusion is robust under diverse experimental configurations (number of observed supernovae, maximum survey redshift, inclusion of additional systematics). This approach to supernova fitting considerably eases otherwise stringent mission calibration requirements. As an example we simulate a space-based mission based on the proposed JDEM satellite; however, the method and conclusions are general and valid for any future supernova dark energy mission, ground- or space-based. Comment: 30 pages, 8 figures, 5 tables, one reference added; submitted to Astroparticle Physics.
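
    The fitting machinery itself is not given in the abstract; the sketch below is a toy illustration (not the paper's code) of fitting cosmological parameters and per-band zero-point offsets simultaneously, with the offsets treated as nuisance parameters constrained by a calibration prior. The cosmological model, band assignment, prior width, and noise levels are all assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

C = 299792.458   # speed of light, km/s
H0 = 70.0        # Hubble constant, km/s/Mpc, held fixed for illustration

def mu(z, omega_m):
    """Distance modulus for a flat LCDM toy cosmology (trapezoidal integral)."""
    zs = np.linspace(0.0, z, 200)
    g = 1.0 / np.sqrt(omega_m * (1 + zs) ** 3 + (1 - omega_m))
    dc = (g.sum() - 0.5 * (g[0] + g[-1])) * (zs[1] - zs[0]) * C / H0  # Mpc
    return 5 * np.log10((1 + z) * dc) + 25

# Toy survey: 300 SNe Ia in four hypothetical bands, with injected
# zero-point errors and 0.12 mag scatter (all magnitudes).
rng = np.random.default_rng(0)
z = rng.uniform(0.1, 1.7, 300)
band = rng.integers(0, 4, size=z.size)
true_zp = np.array([0.02, -0.01, 0.015, -0.005])
m_obs = (np.array([mu(zi, 0.3) for zi in z])
         + true_zp[band] + rng.normal(0.0, 0.12, z.size))

def residuals(p):
    omega_m, zp = p[0], np.asarray(p[1:])
    model = np.array([mu(zi, omega_m) for zi in z]) + zp[band]
    # Simultaneous fit: magnitude residuals plus a 0.02 mag calibration prior
    return np.concatenate([(m_obs - model) / 0.12, zp / 0.02])

fit = least_squares(residuals, x0=[0.5, 0, 0, 0, 0],
                    bounds=([0.01, -0.1, -0.1, -0.1, -0.1],
                            [0.99, 0.1, 0.1, 0.1, 0.1]))
print("Omega_M:", round(fit.x[0], 3), "zero-points:", np.round(fit.x[1:], 3))
```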

    Weak Lensing from Space I: Instrumentation and Survey Strategy

    A wide-field space-based imaging telescope is necessary to fully exploit the technique of observing dark matter via weak gravitational lensing. This first paper in a three-part series outlines the survey strategies and relevant instrumental parameters for such a mission. As a concrete example of hardware design, we consider the proposed Supernova/Acceleration Probe (SNAP). Using SNAP engineering models, we quantify the major contributions to this telescope's Point Spread Function (PSF). These PSF contributions are relevant to any similar wide-field space telescope. We further show that the PSF of SNAP or a similar telescope will be smaller than current ground-based PSFs, and more isotropic and stable over time than the PSF of the Hubble Space Telescope. We outline survey strategies for two different regimes: a "wide" 300 square degree survey and a "deep" 15 square degree survey that will accomplish various weak lensing goals, including statistical studies and dark matter mapping. Comment: 25 pages, 8 figures, 1 table; replaced with Published Version.
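
    The paper's exposure-time calculations are not given in the abstract; as an illustration of the wide-versus-deep trade-off such strategies weigh, the sketch below computes the point-source depth gain from concentrating a fixed total observing time onto a smaller footprint, using the standard background-limited scaling. The uniform-overhead assumption is ours.

```python
import math

def depth_gain_mag(area_wide_deg2: float, area_deep_deg2: float) -> float:
    """Background-limited depth gain from reallocating a fixed total
    observing time from a wide to a deep survey footprint.
    Exposure per field scales inversely with area covered; the limiting
    magnitude deepens by 2.5 * log10(sqrt(t_ratio)) = 1.25 * log10(t_ratio)."""
    t_ratio = area_wide_deg2 / area_deep_deg2
    return 1.25 * math.log10(t_ratio)

# The abstract's two regimes: a "wide" 300 deg^2 and a "deep" 15 deg^2 survey
gain = depth_gain_mag(300.0, 15.0)
print(f"deep survey reaches ~{gain:.2f} mag fainter for the same total time")
# -> ~1.63 mag, assuming background-limited imaging and uniform overheads
```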

    Supernova / Acceleration Probe: A Satellite Experiment to Study the Nature of the Dark Energy

    The Supernova / Acceleration Probe (SNAP) is a proposed space-based experiment designed to study dark energy and alternative explanations of the acceleration of the Universe's expansion by performing a series of complementary, systematics-controlled measurements. We describe a self-consistent reference mission design for building a Type Ia supernova Hubble diagram and for performing a wide-area weak gravitational lensing study. A 2-m wide-field telescope feeds a focal plane consisting of a 0.7 square-degree imager tiled with equal areas of optical CCDs and near-infrared sensors, and a high-efficiency low-resolution integral field spectrograph. The SNAP mission will obtain high-signal-to-noise calibrated light-curves and spectra for several thousand supernovae at redshifts between z=0.1 and 1.7. A wide-field survey covering one thousand square degrees resolves ~100 galaxies per square arcminute. If we assume we live in a cosmological-constant-dominated Universe, the matter density, dark energy density, and flatness of space can all be determined from SNAP supernova and weak-lensing measurements to a systematics-limited accuracy of 1%. For a flat universe, the pressure-to-density ratio of dark energy can similarly be measured to 5% for the present value w0 and ~0.1 for the time variation w'. The large survey area, depth, spatial resolution, time-sampling, and nine-band optical-to-NIR photometry will support additional independent and/or complementary dark-energy measurement approaches as well as a broad range of auxiliary science programs. (Abridged) Comment: 40 pages, 18 figures, submitted to PASP, http://snap.lbl.go
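
    The abstract quotes target precisions on w0 and its time variation w'; as a worked illustration (not the collaboration's code), the sketch below shows how an equation of state w(z) = w0 + w'z propagates into the luminosity distance that supernova magnitudes constrain. The linear-in-z parameterization and the fixed H0 and Omega_M values are assumptions; the paper may use a different convention.

```python
import numpy as np

C, H0 = 299792.458, 70.0  # km/s and km/s/Mpc (H0 assumed for illustration)

def luminosity_distance(z, omega_m=0.3, w0=-1.0, wp=0.0, n=2000):
    """d_L in Mpc for a flat universe with dark energy w(z) = w0 + w'*z.
    The dark-energy density evolves as
      rho_DE(z)/rho_DE(0) = exp(3 * int_0^z (1 + w(x)) / (1 + x) dx),
    evaluated here by cumulative trapezoidal integration."""
    zs = np.linspace(0.0, z, n)
    dz = zs[1] - zs[0]
    f = (1.0 + w0 + wp * zs) / (1.0 + zs)
    cum = np.concatenate([[0.0], np.cumsum((f[1:] + f[:-1]) * 0.5 * dz)])
    rho_de = np.exp(3.0 * cum)
    g = 1.0 / np.sqrt(omega_m * (1 + zs) ** 3 + (1 - omega_m) * rho_de)
    dc = np.sum((g[1:] + g[:-1]) * 0.5 * dz) * C / H0
    return (1 + z) * dc

# Magnitude shift at z = 1.7 from a w' = 0.1 time variation, other
# parameters held fixed -- the scale of signal SNAP must resolve:
dm = 5 * np.log10(luminosity_distance(1.7, w0=-1.0, wp=0.1)
                  / luminosity_distance(1.7, w0=-1.0, wp=0.0))
print(f"Delta m at z=1.7 for w'=0.1: {dm:+.3f} mag")
```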

    Mercury and monomethylmercury in fluids from Sea Cliff submarine hydrothermal field, Gorda Ridge

    Author Posting. © American Geophysical Union, 2006. This article is posted here by permission of American Geophysical Union for personal use, not for redistribution. The definitive version was published in Geophysical Research Letters 33 (2006): L17606, doi:10.1029/2006GL026321. Submarine hydrothermal systems are hypothesized to be a potentially important source of monomethylmercury (MMHg) to the ocean, yet the amount of MMHg in vent fluids is unknown. Here, we report total Hg and MMHg concentrations in hydrothermal vent fluids sampled from the Sea Cliff site on the Gorda Ridge. MMHg is the dominant Hg species, and levels of total Hg are enhanced slightly compared to seawater. Hg is enriched in deposits surrounding the site, suggesting near-field deposition from fluid plumes, with rapid MMHg demethylation and scavenging of Hg(II) complexes. Assuming the flux of MMHg from Sea Cliff is representative of global submarine hydrothermal inputs, we estimate a flux of 0.1–0.4 Mmol y⁻¹, which may be attenuated by scavenging near the vents. However, deep waters are not typically known to be elevated in Hg, and thus we suggest that hydrothermal systems are not significant sources of MMHg to commercial fisheries. Support: WHOI Academic Programs Office, the Penzance Endowed Discretionary Fund, NSF-OCE and EPA-STAR, NOAA-NUR
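
    The concentrations and fluid fluxes behind the 0.1–0.4 Mmol y⁻¹ estimate are not in the abstract; the snippet below only illustrates the form of such a scaling (global flux ≈ vent-fluid concentration × global hydrothermal fluid flux). Every numerical value is a hypothetical placeholder, not the paper's data.

```python
# Illustrative scaling behind a global hydrothermal MMHg flux estimate.
# All numerical values are hypothetical placeholders, NOT measurements
# reported in the paper.

PMOL_TO_MOL = 1e-12                 # mol per pmol

mmhg_conc_pM = 2.0                  # hypothetical vent-fluid MMHg, pmol/L
global_fluid_flux_L_yr = 1e17       # hypothetical global hydrothermal flux, L/yr

flux_mol_yr = mmhg_conc_pM * PMOL_TO_MOL * global_fluid_flux_L_yr
print(f"global MMHg flux ~ {flux_mol_yr / 1e6:.2f} Mmol/yr")
# 2 pM x 1e17 L/yr = 2e5 mol/yr = 0.2 Mmol/yr -- inside the paper's
# 0.1-0.4 Mmol/yr range only because the placeholders were chosen so.
```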

    Evidence for the η_b(1S) Meson in Radiative Υ(2S) Decay

    We have performed a search for the η_b(1S) meson in the radiative decay of the Υ(2S) resonance using a sample of 91.6 × 10^6 Υ(2S) events recorded with the BABAR detector at the PEP-II B factory at the SLAC National Accelerator Laboratory. We observe a peak in the photon energy spectrum at E_γ = 609.3^(+4.6)_(-4.5)(stat) ± 1.9(syst) MeV, corresponding to an η_b(1S) mass of 9394.2^(+4.8)_(-4.9)(stat) ± 2.0(syst) MeV/c^2. The branching fraction for the decay Υ(2S) → γη_b(1S) is determined to be [3.9 ± 1.1(stat)^(+1.1)_(-0.9)(syst)] × 10^(-4). We find the ratio of branching fractions B[Υ(2S) → γη_b(1S)]/B[Υ(3S) → γη_b(1S)] = 0.82 ± 0.24(stat)^(+0.20)_(-0.19)(syst).
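
    The quoted photon energy and η_b(1S) mass are linked by standard two-body decay kinematics: in the Υ(2S) rest frame, E_γ = (m_Υ² − m_ηb²) / (2 m_Υ). The short check below inverts this relation; the Υ(2S) mass is the PDG world average, an input assumption rather than a number from the abstract.

```python
import math

# Two-body radiative decay Upsilon(2S) -> gamma eta_b(1S):
# E_gamma = (m_Y**2 - m_eta**2) / (2 * m_Y) in the Upsilon rest frame,
# so m_eta = sqrt(m_Y**2 - 2 * m_Y * E_gamma).

M_UPSILON_2S = 10023.26   # MeV/c^2, PDG world average (assumed input)
E_GAMMA = 609.3           # MeV, the measured peak from the abstract

m_eta_b = math.sqrt(M_UPSILON_2S**2 - 2 * M_UPSILON_2S * E_GAMMA)
print(f"m(eta_b) = {m_eta_b:.1f} MeV/c^2")   # ~9394.2, matching the abstract
```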

    Search for charged Higgs decays of the top quark using hadronic tau decays

    We present the result of a search for charged Higgs decays of the top quark, produced in pp̄ collisions at √s = 1.8 TeV. When the charged Higgs is heavy and decays to a tau lepton, which subsequently decays hadronically, the resulting events have a unique signature: large missing transverse energy and the low-charged-multiplicity tau. Data collected in the period 1992–1993 at the Collider Detector at Fermilab, corresponding to 18.7 ± 0.7 pb⁻¹, exclude new regions of combined top quark and charged Higgs mass, in extensions to the standard model with two Higgs doublets. Comment: uuencoded, gzipped tar file of LaTeX and 6 PostScript figures; 11 pp; submitted to Phys. Rev.
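
    The published selection cuts are not given in the abstract; as a hypothetical sketch of the signature-based selection it describes (large missing transverse energy plus a low-charged-multiplicity hadronic tau), the snippet below filters toy events. The threshold and field names are invented for illustration, not the CDF analysis values.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Event:
    met: float                  # missing transverse energy, GeV
    tau_prongs: List[int]       # charged-track counts of tau candidates

MET_CUT = 40.0                  # GeV; hypothetical, not the published CDF cut

def passes(ev: Event) -> bool:
    """Signature from the abstract: large missing E_T plus at least one
    low-charged-multiplicity (1- or 3-prong) hadronic tau candidate."""
    return ev.met > MET_CUT and any(n in (1, 3) for n in ev.tau_prongs)

events = [Event(55.0, [1]), Event(20.0, [3]), Event(60.0, [5])]
print(sum(passes(ev) for ev in events), "of", len(events), "toy events pass")
# -> 1 of 3: the second fails the MET cut, the third has no 1/3-prong tau
```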

    Inclusive jet cross section in p̄p collisions at √s = 1.8 TeV

    The inclusive jet differential cross section has been measured for jet transverse energies, E_T, from 15 to 440 GeV, in the pseudorapidity region 0.1 ≤ |η| ≤ 0.7. The results are based on 19.5 pb⁻¹ of data collected by the CDF collaboration at the Fermilab Tevatron collider. The data are compared with QCD predictions for various sets of parton distribution functions. The cross section for jets with E_T > 200 GeV is significantly higher than current predictions based on O(α_s³) perturbative QCD calculations. Various possible explanations for the high-E_T excess are discussed. Comment: 8 pages with 2 EPS uu-encoded figures; submitted to Physical Review Letters.
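
    The published spectra are not reproduced in the abstract; as a generic illustration of how such a data-to-theory comparison is quantified, the sketch below computes the fractional deviation and significance per E_T bin. All numbers are hypothetical placeholders, not CDF measurements.

```python
import numpy as np

# Toy inclusive-jet spectrum comparison; values are hypothetical, not CDF data.
et_bins = np.array([50.0, 100.0, 200.0, 300.0, 440.0])      # GeV
data    = np.array([1.2e2, 3.0e0, 2.4e-2, 9.0e-4, 4.2e-5])  # nb/GeV, toy
theory  = np.array([1.2e2, 3.0e0, 2.0e-2, 7.0e-4, 3.0e-5])  # NLO-like, toy
sigma   = 0.10 * data                                        # toy 10% errors

frac_dev = (data - theory) / theory
signif = (data - theory) / sigma
for et, f, s in zip(et_bins, frac_dev, signif):
    print(f"E_T = {et:5.0f} GeV: (data-theory)/theory = {f:+.2f} ({s:+.1f} sigma)")
# A deviation that grows with E_T, as in the last three toy bins, is the
# kind of high-E_T excess the paper weighs against O(alpha_s^3) predictions.
```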